Machine learning based interatomic potential for amorphous carbon
We introduce a Gaussian approximation potential (GAP) for atomistic simulations of liquid and amorphous elemental carbon. Based on a machine-learning representation of the density-functional theory (DFT) potential-energy surface, such interatomic potentials enable materials simulations with close-to-DFT accuracy but at much lower computational cost. We first determine the maximum accuracy that any finite-range potential can achieve in carbon structures; then, using a hierarchical set of two-, three-, and many-body structural descriptors, we construct a GAP model that can indeed reach the target accuracy. The potential yields accurate energetic and structural properties over a wide range of densities; it also correctly captures the structure of the liquid phases, at variance with a state-of-the-art empirical potential. Exemplary applications of the GAP model to surfaces of “diamondlike” tetrahedral amorphous carbon (ta-C) are presented, including an estimate of the amorphous material’s surface energy and simulations of high-temperature surface reconstructions (“graphitization”). The presented interatomic potential appears to be promising for realistic and accurate simulations of nanoscale amorphous carbon structures. V.L.D. gratefully acknowledges a postdoctoral fellowship from the Alexander von Humboldt Foundation and support from the Isaac Newton Trust (Trinity College Cambridge). This work used the ARCHER UK National Supercomputing Service (http://www.archer.ac.uk) via EPSRC Grant No. EP/K014560/1.
Many-Body Dispersion Correction Effects on Bulk and Surface Properties of Rutile and Anatase TiO2
Titanium dioxide (titania, TiO2) is a widely studied material with diverse applications. Here, we explore how pairwise and many-body descriptions of van der Waals dispersion interactions perform in atomistic modeling of the two most important TiO2 polymorphs, rutile and anatase. In particular, we obtain an excellent description of both bulk structures from density-functional theory (DFT) computations with the many-body dispersion (MBD) method of Tkatchenko and co-workers coupled to an iterative Hirshfeld partitioning scheme ("Hirshfeld-I"). Beyond the bulk, we investigate the most important crystal surfaces, namely, rutile (110), (101), and (100) and anatase (101), (100), and (001). Dispersion has a highly anisotropic effect on the different (hkl) surfaces; this directly changes the predicted nanocrystal morphology as determined from Wulff constructions. The periodic DFT+MBD method combined with Hirshfeld-I partitioning appears to be promising for future large-scale atomistic studies of this technologically important material. V.L.D. gratefully acknowledges a postdoctoral fellowship from the Alexander von Humboldt Foundation. This work used the ARCHER UK National Supercomputing Service, access to which was granted via support for the UKCP consortium (Engineering and Physical Sciences Research Council Grant EP/K014560/1).
Understanding the thermal properties of amorphous solids using machine-learning-based interatomic potentials
© 2018 Informa UK Limited, trading as Taylor & Francis Group. Understanding the thermal properties of disordered systems is of fundamental importance for condensed matter physics - and for practical applications as well. While quantities such as the thermal conductivity are usually well characterised experimentally, their microscopic origin is often largely unknown - hence the pressing need for molecular simulations. However, the time and length scales involved with thermal transport phenomena are typically well beyond the reach of ab initio calculations. On the other hand, many amorphous materials are characterised by a complex structure, which prevents the construction of classical interatomic potentials. One way to get past this deadlock is to harness machine-learning (ML) algorithms to build interatomic potentials: these can be nearly as computationally efficient as classical force fields while retaining much of the accuracy of first-principles calculations. Here, we discuss neural network potentials (NNPs) and Gaussian approximation potentials (GAPs), two popular ML frameworks. We review the work that has been devoted to investigating, via NNPs, the thermal properties of phase-change materials, systems widely used in non-volatile memories. In addition, we present recent results on the vibrational properties of amorphous carbon, studied via GAPs. In light of these results, we argue that ML-based potentials are among the best options available to further our understanding of the vibrational and thermal properties of complex amorphous solids.
First-principles study of alkali-metal intercalation in disordered carbon anode materials
The intercalation of alkali metals in disordered carbon anode materials is studied by a combination of first-principles and machine-learning methods.
An accurate and transferable machine learning potential for carbon
We present an accurate machine learning (ML) model for atomistic simulations of carbon, constructed using the Gaussian approximation potential (GAP) methodology. The potential, named GAP-20, describes the properties of the bulk crystalline and amorphous phases, crystal surfaces, and defect structures with an accuracy approaching that of direct ab initio simulation, but at a significantly reduced cost. We combine structural databases for amorphous carbon and graphene, which we extend substantially by adding suitable configurations, for example, for defects in graphene and other nanostructures. The final potential is fitted to reference data computed using the optB88-vdW density functional theory (DFT) functional. Dispersion interactions, which are crucial to describe multilayer carbonaceous materials, are therefore implicitly included. We additionally account for long-range dispersion interactions using a semianalytical two-body term and show that an improved model can be obtained through an optimization of the many-body smooth overlap of atomic positions descriptor. We rigorously test the potential on lattice parameters, bond lengths, formation energies, and phonon dispersions of numerous carbon allotropes. We compare the formation energies of an extensive set of defect structures, surfaces, and surface reconstructions to DFT reference calculations. The present work demonstrates the ability to combine, in the same ML model, the previously attained flexibility required for amorphous carbon [V. L. Deringer and G. Csányi, Phys. Rev. B 95, 094203 (2017)] with the high numerical accuracy necessary for crystalline graphene [Rowe et al., Phys. Rev. B 97, 054303 (2018)], thereby providing an interatomic potential that will be applicable to a wide range of applications concerning diverse forms of bulk and nanostructured carbon
Gaussian Process Regression for Materials and Molecules.
We provide an introduction to Gaussian process regression (GPR) machine-learning methods in computational materials science and chemistry. The focus of the present review is on the regression of atomistic properties: in particular, on the construction of interatomic potentials, or force fields, in the Gaussian Approximation Potential (GAP) framework; beyond this, we also discuss the fitting of arbitrary scalar, vectorial, and tensorial quantities. Methodological aspects of reference data generation, representation, and regression, as well as the question of how a data-driven model may be validated, are reviewed and critically discussed. A survey of applications to a variety of research questions in chemistry and materials science illustrates the rapid growth in the field. A vision is outlined for the development of the methodology in the years to come.
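As a concrete toy illustration of the kernel-regression idea this review covers, the sketch below fits a one-dimensional stand-in "potential-energy surface" with plain GPR. The squared-exponential kernel, the noise level, and the quadratic target are illustrative choices made here, not the descriptors or hyperparameters of the actual GAP framework.

```python
import numpy as np

def rbf_kernel(X1, X2, length_scale=1.0):
    """Squared-exponential kernel between two sets of descriptor vectors."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

def gpr_fit_predict(X_train, y_train, X_test, noise=1e-3):
    """Plain GPR mean prediction: y* = K_* (K + sigma^2 I)^{-1} y."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf_kernel(X_test, X_train) @ alpha

# Toy example: regress a 1-D "energy surface" from sampled points.
X = np.linspace(-2, 2, 20).reshape(-1, 1)
y = X[:, 0] ** 2              # stand-in for reference (e.g. DFT) energies
print(gpr_fit_predict(X, y, np.array([[0.5]])))  # close to 0.25
```

In the GAP setting the inputs would be structural descriptors of atomic environments rather than raw coordinates, and forces would be fitted jointly with energies.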
Lattice thermal expansion and anisotropic displacements in α-sulfur from diffraction experiments and first-principles theory
Thermal properties of solid-state materials are a fundamental topic of study
with important practical implications. For example, anisotropic displacement
parameters (ADPs) are routinely used in physics, chemistry, and crystallography
to quantify the thermal motion of atoms in crystals. ADPs are commonly derived
from diffraction experiments, but recent developments have also enabled their
first-principles prediction using periodic density functional theory (DFT).
Here, we combine experiments and dispersion-corrected DFT to quantify lattice
thermal expansion and ADPs in crystalline α-sulfur (S8), a prototypical
elemental solid that is controlled by the interplay of covalent and van der
Waals interactions. We first report on single-crystal and powder X-ray
diffraction (XRD) measurements that provide new and improved reference data
from 10 K up to room temperature. We then use several popular
dispersion-corrected DFT methods to predict vibrational and thermal properties
of α-sulfur, including the anisotropic lattice thermal expansion.
ADPs are then derived in the commonly used harmonic approximation (in the
computed zero-Kelvin structure) and also in the quasi-harmonic approximation
(QHA) which takes the predicted lattice thermal expansion into account. At the
PBE+D3(BJ) level, the latter leads to excellent agreement with experiments.
Finally, more general implications of this study for realistic materials
modeling at finite temperature are discussed.
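For context, the harmonic-approximation route to ADPs mentioned above rests on a standard textbook expression (not given in the abstract): the mean-square displacement tensor of an atom of mass m follows from the phonon frequencies and eigenvectors as

```latex
\left\langle u_i u_j \right\rangle
  = \frac{\hbar}{2 N m}
    \sum_{\mathbf{q},\nu}
    \frac{e_i(\mathbf{q},\nu)\, e_j^{*}(\mathbf{q},\nu)}{\omega(\mathbf{q},\nu)}
    \coth\!\left( \frac{\hbar\,\omega(\mathbf{q},\nu)}{2 k_{\mathrm{B}} T} \right)
```

where the sum runs over phonon wavevectors q and branches ν. In the quasi-harmonic approximation, the frequencies and eigenvectors are re-evaluated at the thermally expanded lattice rather than at the zero-Kelvin structure.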
Mapping Materials and Molecules
The visualization of data is indispensable in scientific research, from the early stages when human insight forms to the final step of communicating results. In computational physics, chemistry and materials science, it can be as simple as making a scatter plot or as straightforward as looking through the snapshots of atomic positions manually. However, as a result of the “big data” revolution, these conventional approaches are often inadequate. The widespread adoption of high-throughput computation for materials discovery and the associated community-wide repositories have given rise to data sets that contain an enormous number of compounds and atomic configurations. A typical data set contains thousands to millions of atomic structures, along with a diverse range of properties such as formation energies, band gaps, or bioactivities.
It would thus be desirable to have a data-driven and automated framework for visualizing and analyzing such structural data sets. The key idea is to construct a low-dimensional representation of the data, which facilitates navigation, reveals underlying patterns, and helps to identify data points with unusual attributes. Such data-intensive maps, often employing machine learning methods, are appearing more and more frequently in the literature. However, to the wider community, it is not always transparent how these maps are made and how they should be interpreted. Furthermore, while these maps undoubtedly serve a decorative purpose in academic publications, it is not always apparent what extra information can be garnered from reading or making them.
This Account attempts to answer such questions. We start with a concise summary of the theory of representing chemical environments, followed by the introduction of a simple yet practical conceptual approach for generating structure maps in a generic and automated manner. Such analysis and mapping is made nearly effortless by employing the newly developed software tool ASAP. To showcase the applicability to a wide variety of systems in chemistry and materials science, we provide several illustrative examples, including crystalline and amorphous materials, interfaces, and organic molecules. In these examples, the maps not only help to sift through large data sets but also reveal hidden patterns that could be easily missed using conventional analyses.
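A minimal sketch of the "low-dimensional representation" idea described above, using PCA as a simple stand-in for the more sophisticated embeddings that tools such as ASAP provide; the random "descriptors" here are placeholders for real structural fingerprints.

```python
import numpy as np

def pca_map(features, n_components=2):
    """Project high-dimensional structural descriptors onto a 2-D map
    via principal component analysis (a common linear baseline)."""
    X = features - features.mean(axis=0)
    # SVD of the centred data matrix yields the principal axes directly.
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T

# Toy data set: 100 "structures", each described by a 10-D feature vector.
rng = np.random.default_rng(0)
descriptors = rng.normal(size=(100, 10))
coords = pca_map(descriptors)
print(coords.shape)  # (100, 2) -> one point per structure on the map
```

Each row of `coords` becomes one point on the structure map; colouring the points by a property (formation energy, band gap, etc.) then reveals the patterns the Account discusses.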
The explosion in the amount of computed information in chemistry and materials science has made visualization into a science in itself. Not only have we benefited from exploiting these visualization methods in previous works, we also believe that the automated mapping of data sets will in turn stimulate further creativity and exploration, as well as ultimately feed back into future advances in the respective fields.
Machine-learning of atomic-scale properties based on physical principles
We briefly summarize the kernel regression approach, as used recently in
materials modelling, to fitting functions, particularly potential energy
surfaces, and highlight how the linear algebra framework can be used to both
predict and train from linear functionals of the potential energy, such as the
total energy and atomic forces. We then give a detailed account of the Smooth
Overlap of Atomic Positions (SOAP) representation and kernel, showing how it
arises from an abstract representation of smooth atomic densities, and how it
is related to several popular density-based representations of atomic
structure. We also discuss recent generalisations that allow fine control of
correlations between different atomic species, prediction and fitting of
tensorial properties, and also how to construct structural kernels (applicable
to comparing entire molecules or periodic systems) that go beyond an additive
combination of local environments.
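To illustrate the final point, an additive ("average") structural kernel can be assembled from per-environment kernels as sketched below. The normalised-dot-product form raised to a power echoes the SOAP kernel, but the environment vectors here are random stand-ins, not actual SOAP power spectra.

```python
import numpy as np

def local_kernel(x, y, zeta=2):
    """SOAP-style kernel between two local environments: a normalised
    dot product raised to the power zeta."""
    k = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return k ** zeta

def average_kernel(envs_A, envs_B, zeta=2):
    """Additive structural kernel between two structures: the mean of
    all pairwise local-environment kernels."""
    vals = [local_kernel(a, b, zeta) for a in envs_A for b in envs_B]
    return float(np.mean(vals))

# Toy example: two "structures", each a set of local-environment vectors.
rng = np.random.default_rng(1)
A = rng.normal(size=(3, 8))   # structure A: 3 environments, 8-D features
B = rng.normal(size=(4, 8))   # structure B: 4 environments, 8-D features
print(average_kernel(A, B))
```

The structural kernels discussed at the end of the abstract go beyond exactly this additive form, e.g. by matching environments between the two structures rather than averaging over all pairs.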